Quantifying the complexity of neural network output using entropy measures


Similar articles

Assessment of the Efficiency of S.P.G.C Refineries Using Network DEA

Data envelopment analysis (DEA) is a powerful tool for measuring the relative efficiency of organizational units referred to as decision making units (DMUs). In most cases, DMUs have network structures with internal linking activities. Traditional DEA models, however, treat DMUs as black boxes with no regard to their linking activities and therefore do not provide decision makers with the reasons...
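
For background, the efficiency score in the classic (non-network) CCR ratio model is defined as follows; this is the textbook formulation, not the network DEA model developed in that paper, and the notation (weights u_r and v_i, DMU index o) is standard rather than taken from it:

\max_{u,v}\; \theta_o = \frac{\sum_r u_r\, y_{ro}}{\sum_i v_i\, x_{io}}
\quad\text{subject to}\quad
\frac{\sum_r u_r\, y_{rj}}{\sum_i v_i\, x_{ij}} \le 1 \;\text{ for all DMUs } j,
\qquad u_r,\, v_i \ge 0,

where x_{ij} and y_{rj} are the inputs and outputs of DMU j, and DMU o is rated efficient when \theta_o = 1. Network DEA replaces this black-box ratio with models that also score the internal stages and their linking flows.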

Scour Modeling of the Piles of Kambuzia Industrial City Bridge Using HEC-RAS and Artificial Neural Network

Today, scour is one of the important topics in river and coastal engineering, since most bridge failures are caused by this phenomenon. Because bridges are among the most important connecting structures in a country's road network, and their importance is doubled during floods, their precise design and maintenance are crucial. F...

Quantifying Neural Correlations Using Lempel-Ziv Complexity

Spike train analysis generally focuses on two aims: (1) estimating the quantity of neuronal information, and (2) quantifying spike or burst synchronization. We introduce here a new multivariate index based on Lempel-Ziv complexity for spike train analysis. This index, called mutual Lempel-Ziv complexity (MLZC), can both measure spike correlations and estimate the information carri...
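
To make the underlying measure concrete, here is a minimal Lempel-Ziv complexity sketch for a single binarized spike train. It uses a simplified LZ78-style sequential parsing; it is not the multivariate MLZC index introduced in that paper, and the function and parameter names are illustrative only.

def lz_complexity(spikes, threshold=0):
    """Simplified Lempel-Ziv complexity of a binarized spike train.

    Values in `spikes` above `threshold` are treated as 1 (spike) and the
    rest as 0. The result is the number of phrases in an LZ78-style
    sequential parsing, where each phrase is the shortest substring not
    yet seen as a phrase.
    """
    symbols = ['1' if s > threshold else '0' for s in spikes]
    phrases = set()
    phrase = ''
    count = 0
    for symbol in symbols:
        phrase += symbol
        if phrase not in phrases:  # shortest new phrase found
            phrases.add(phrase)
            count += 1
            phrase = ''
    if phrase:  # trailing phrase that repeats an earlier one
        count += 1
    return count

# A strictly periodic train parses into fewer phrases than an irregular one
# of the same length (8 vs. 10 phrases for these two 20-bin examples).
print(lz_complexity([0, 1] * 10))
print(lz_complexity([0, 1, 1, 0, 1, 0, 0, 1, 1, 1, 0, 1, 0, 0, 0, 1, 0, 1, 1, 0]))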


Entropy Measures vs. Kolmogorov Complexity

Kolmogorov complexity and Shannon entropy are conceptually different measures. However, for any recursive probability distribution, the expected value of Kolmogorov complexity equals its Shannon entropy, up to a constant. We study whether a similar relationship holds for Rényi and Tsallis entropies of order α, showing that it only holds for α = 1. Regarding a time-bounded analogue relationship, we s...
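
For reference, the relationship mentioned above is usually stated as follows; these are the standard definitions rather than formulas taken from the paper itself:

\sum_x P(x)\,K(x) = H(P) + O(1),
\qquad H(P) = -\sum_x P(x)\log_2 P(x),

where K is (prefix) Kolmogorov complexity and P is a recursive probability distribution. The Rényi and Tsallis entropies of order \alpha are

H_\alpha(P) = \frac{1}{1-\alpha}\log_2\sum_x P(x)^{\alpha},
\qquad
S_\alpha(P) = \frac{1}{\alpha-1}\Bigl(1 - \sum_x P(x)^{\alpha}\Bigr),

and both reduce to H(P) as \alpha \to 1, the only order for which, per the abstract, the expectation identity extends.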


Measures of maximal entropy

We extend the results of Walters on the uniqueness of invariant measures with maximal entropy on compact groups to an arbitrary locally compact group. We show that the maximal entropy is attained at the left Haar measure and that the measure of maximal entropy is unique.
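
As background, a measure of maximal entropy is usually defined via the variational principle; the formulation below is the standard compact-space statement, included only for orientation:

h_{\mathrm{top}}(T) = \sup_{\mu}\, h_{\mu}(T),

where the supremum runs over all T-invariant Borel probability measures, and \mu is a measure of maximal entropy when h_{\mu}(T) = h_{\mathrm{top}}(T). The abstract's claim is that, in its locally compact group setting, this supremum is attained uniquely at the left Haar measure.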



Journal

Journal title: BMC Neuroscience

Year: 2009

ISSN: 1471-2202

DOI: 10.1186/1471-2202-10-s1-p322